
    Cost-Sensitive Boosting


    Differential Phase Estimation with the SeaMARC II Bathymetric Sidescan Sonar System

    A maximum-likelihood estimator is used to extract differential phase measurements from noisy seafloor echoes received at pairs of transducers mounted on either side of the SeaMARC II bathymetric sidescan sonar system. Carrier frequencies for each side are about 1 kHz apart, and echoes from a transmitted pulse 2 ms long are analyzed. For each side, phase-difference sequences are derived from the full complex data, consisting of base-banded and digitized quadrature components of the received echoes. With less bias and a lower variance, this method is shown to be more efficient than a uniform mean estimator. It also does not exhibit the angular or time ambiguities commonly found in the histogram method used in the SeaMARC II system. A figure for the estimation uncertainty of the phase difference is presented, and results are obtained for both real and simulated data. Based on this error estimate and an empirical verification derived through coherent ping stacking, a single filter length of 100 ms is chosen for data processing.
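For a constant phase offset observed in circularly symmetric Gaussian noise, the maximum-likelihood estimate reduces to the argument of the summed conjugate product of the two quadrature sample sequences. A minimal sketch of that estimator (function name and synthetic data are hypothetical, not the SeaMARC II processing code):

```python
import cmath

def ml_phase_difference(z1, z2):
    """Phase of the summed conjugate product of two complex quadrature
    sample sequences -- the ML estimate of a constant phase offset
    under circularly symmetric Gaussian noise."""
    return cmath.phase(sum(a * b.conjugate() for a, b in zip(z1, z2)))

# Two synthetic echoes with a fixed 0.5 rad offset between receivers
z1 = [cmath.exp(1j * (0.1 * n + 0.5)) for n in range(100)]
z2 = [cmath.exp(1j * (0.1 * n)) for n in range(100)]
print(ml_phase_difference(z1, z2))  # ≈ 0.5
```

Because the conjugate products are summed as complex numbers before taking the argument, samples with larger magnitude contribute more weight, which is what gives the estimator its lower variance compared with averaging per-sample phase differences directly.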

    CSNL: A cost-sensitive non-linear decision tree algorithm

    This article presents a new decision tree learning algorithm called CSNL that induces Cost-Sensitive Non-Linear decision trees. The algorithm is based on the hypothesis that nonlinear decision nodes provide a better basis than axis-parallel decision nodes, and it utilizes discriminant analysis to construct nonlinear decision trees that take account of costs of misclassification. The performance of the algorithm is evaluated by applying it to seventeen datasets, and the results are compared with those obtained by two well-known cost-sensitive algorithms, ICET and MetaCost, which generate multiple trees to obtain some of the best results to date. The results show that CSNL performs at least as well as, if not better than, these algorithms in more than twelve of the datasets and is considerably faster. The use of bagging with CSNL further enhances its performance, showing the significant benefits of using nonlinear decision nodes.
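The core idea of a cost-sensitive discriminant node can be illustrated in one dimension: fit a Gaussian to each class, compare the (quadratic, hence nonlinear) log-likelihoods, and shift the decision by the misclassification-cost ratio. This is a simplified sketch of the idea behind CSNL's discriminant nodes, not the published algorithm (all names hypothetical, equal priors assumed):

```python
import math

def fit_gaussian(xs):
    """Mean and (population) variance of a 1-D sample."""
    m = sum(xs) / len(xs)
    v = sum((x - m) ** 2 for x in xs) / len(xs)
    return m, v

def make_cost_sensitive_node(pos, neg, cost_fn, cost_fp):
    """Build a quadratic (non-axis-parallel) decision node: predict the
    class whose expected misclassification cost is lower."""
    mp, vp = fit_gaussian(pos)
    mn, vn = fit_gaussian(neg)
    def node(x):
        # Gaussian log-likelihoods under each class (quadratic in x)
        lp = -0.5 * math.log(2 * math.pi * vp) - (x - mp) ** 2 / (2 * vp)
        ln = -0.5 * math.log(2 * math.pi * vn) - (x - mn) ** 2 / (2 * vn)
        # Predict positive when cost_fn * P(pos|x) > cost_fp * P(neg|x)
        return 1 if lp + math.log(cost_fn) > ln + math.log(cost_fp) else 0
    return node

# Toy data: positives near 2.0, negatives near 0.0; false negatives cost 5x
pos = [1.8, 2.0, 2.2, 1.9, 2.1]
neg = [-0.2, 0.0, 0.2, -0.1, 0.1]
node = make_cost_sensitive_node(pos, neg, cost_fn=5.0, cost_fp=1.0)
print(node(2.0), node(0.0))
```

Because the decision boundary comes from comparing two quadratics rather than thresholding a single attribute, a single node can carve out curved regions that an axis-parallel split would need several nodes to approximate.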

    Calibrating AdaBoost for Asymmetric Learning

    Asymmetric classification problems are characterized by class imbalance or unequal costs for different types of misclassification. One of the main cited weaknesses of AdaBoost is its perceived inability to handle asymmetric problems. As a result, a multitude of asymmetric versions of AdaBoost have been proposed, mainly as heuristic modifications to the original algorithm. In this paper we challenge this approach and propose instead handling asymmetric tasks by properly calibrating the scores of the original AdaBoost so that they correspond to probability estimates. We then account for the asymmetry using classic decision-theoretic approaches. Empirical comparisons of this approach against the most representative asymmetric AdaBoost variants show that it compares favorably. Moreover, it retains the theoretical guarantees of the original AdaBoost and can easily be adjusted to account for changes in class imbalance or costs without need for retraining.
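The two-step recipe described here can be sketched directly: map the AdaBoost margin to a probability, then apply the standard decision-theoretic threshold derived from the costs. The closed-form logistic link below is the textbook log-odds interpretation of the AdaBoost score; the paper's approach would fit a calibration map (e.g. Platt scaling) on held-out data instead, and the function names are hypothetical:

```python
import math

def calibrate(score):
    """Map an AdaBoost margin F(x) to a probability estimate via the
    log-odds interpretation: P(y=1|x) ~ 1 / (1 + exp(-2 F(x)))."""
    return 1.0 / (1.0 + math.exp(-2.0 * score))

def cost_sensitive_predict(score, cost_fp, cost_fn):
    """Predict positive when the expected cost of a false negative,
    p * cost_fn, exceeds that of a false positive, (1 - p) * cost_fp,
    i.e. when p > cost_fp / (cost_fp + cost_fn)."""
    p = calibrate(score)
    return 1 if p > cost_fp / (cost_fp + cost_fn) else 0

# A borderline score (p = 0.5) flips with the cost ratio:
print(cost_sensitive_predict(0.0, cost_fp=1.0, cost_fn=4.0))  # costly misses -> 1
print(cost_sensitive_predict(0.0, cost_fp=4.0, cost_fn=1.0))  # costly alarms -> 0
```

Note that the trained ensemble never changes: adjusting the costs only moves the threshold, which is exactly why this approach needs no retraining when the imbalance or cost structure shifts.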